Incremental Learning
Enhancing Knowledge Transfer for Task Incremental Learning with Data-free Subnetwork
Qiang Gao
DSN primarily seeks to transfer knowledge from the already-learned tasks to the newly arriving task by selecting the affiliated weights of a small set of neurons to be activated, including neurons reused from prior tasks via neuron-wise masks. It also transfers possibly valuable knowledge back to earlier tasks via data-free replay.
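The neuron-wise masking idea above can be sketched minimally in NumPy. This is an illustrative toy, not the DSN implementation; the names `mask_t1`, `mask_t2`, and `forward` are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 4))  # one layer: 8 neurons, 4 inputs

# Neurons 0-2 were allocated to task 1 (their weights would be frozen).
mask_t1 = np.zeros(8, dtype=bool)
mask_t1[:3] = True

# Task 2 reuses the task-1 neurons and activates two of the free neurons.
new_ids = np.flatnonzero(~mask_t1)[:2]
mask_t2 = mask_t1.copy()
mask_t2[new_ids] = True

def forward(x, W, mask):
    """Forward pass through only the neurons selected for the current task."""
    h = np.maximum(W @ x, 0.0)      # ReLU pre-activations
    return np.where(mask, h, 0.0)   # masked-out neurons contribute nothing

x = rng.standard_normal(4)
out = forward(x, W, mask_t2)
```

Reusing the frozen task-1 neurons inside `mask_t2` is what lets the new task benefit from prior knowledge without overwriting it.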
Deep networks have shown remarkable results in the task of object detection. However, their performance drops critically when they are subsequently trained on novel classes without any samples from the base classes originally used to train the model. This phenomenon is known as catastrophic forgetting. Recently, several incremental learning methods have been proposed to mitigate catastrophic forgetting in object detection. Despite their effectiveness, these methods require the unlabeled base classes to co-occur in the training data of the novel classes. This requirement is impractical in many real-world settings, since the base classes do not necessarily co-occur with the novel classes.
Mining
For class-incremental semantic segmentation, such a phenomenon often becomes much worse due to background shift, i.e., some concepts learned at previous stages are assigned to the background class at the current training stage, which significantly reduces performance on these old concepts. To address this issue, we propose a simple yet effective method named Mining unseen Classes via Regional Objectness for Segmentation (MicroSeg).
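The background shift described above can be made concrete with a toy label map: at each incremental stage only the current classes are annotated, so pixels of previously learned classes collapse to background. The helper `stage_labels` is hypothetical, for illustration only:

```python
import numpy as np

# Full ground truth for a 3x3 image containing classes 1, 2, and 3.
gt = np.array([[1, 1, 0],
               [2, 2, 0],
               [3, 3, 0]])

def stage_labels(gt, current_classes):
    """Labels visible at one training stage: non-current classes become background (0)."""
    return np.where(np.isin(gt, current_classes), gt, 0)

# Stage 1 learns class 1; stage 2 learns classes 2 and 3.
stage1 = stage_labels(gt, [1])
stage2 = stage_labels(gt, [2, 3])
# At stage 2, the class-1 pixels learned earlier are now labeled as background,
# so naive training pushes the model to "forget" class 1.
```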